
    Spatial contrast sensitivity in adolescents with autism spectrum disorders

    Adolescents with autism spectrum disorders (ASD) and typically developing (TD) controls underwent a rigorous psychophysical assessment that measured contrast sensitivity at seven spatial frequencies (0.5-20 cycles/degree). A contrast sensitivity function (CSF) was then fitted for each participant, from which four measures were obtained: visual acuity, peak spatial frequency, peak contrast sensitivity, and contrast sensitivity at a low spatial frequency. There were no group differences on any of the four CSF measures, indicating no differential spatial frequency processing in ASD. Although it has been suggested that the detail-oriented visual perception of individuals with ASD may result from differential sensitivity to low versus high spatial frequencies, the current study finds no evidence to support this hypothesis.
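    As an illustrative aside (the paper does not state its fitting procedure), a CSF is often summarised with a log-parabola model; the sketch below fits such a model to hypothetical sensitivity data and reads off the four summary measures. All data values and the acuity criterion are assumptions.

# Minimal sketch: fit a log-parabola contrast sensitivity function to
# hypothetical data and derive the four summary measures named above.
import numpy as np
from scipy.optimize import curve_fit

def log_parabola_csf(freq, peak_sens, peak_freq, bandwidth):
    # Parabola in log10(frequency) / log10(sensitivity) coordinates.
    return peak_sens * 10.0 ** (-((np.log10(freq) - np.log10(peak_freq)) / bandwidth) ** 2)

# Hypothetical sensitivities at the seven tested spatial frequencies (cycles/degree).
freqs = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0, 20.0])
sens = np.array([30.0, 80.0, 150.0, 160.0, 90.0, 20.0, 8.0])

params, _ = curve_fit(log_parabola_csf, freqs, sens, p0=[150.0, 3.0, 1.0])
peak_sens, peak_freq, bandwidth = params

# Sensitivity at a low spatial frequency (here 0.5 c/deg).
low_sf_sens = log_parabola_csf(0.5, *params)

# Acuity estimate: high-frequency cut-off where fitted sensitivity falls to 1.
grid = np.linspace(peak_freq, 60.0, 5000)
below = log_parabola_csf(grid, *params) < 1.0
acuity = grid[np.argmax(below)] if below.any() else float("nan")

print(f"peak sensitivity {peak_sens:.0f} at {peak_freq:.1f} c/deg, "
      f"sensitivity at 0.5 c/deg {low_sf_sens:.0f}, acuity ~{acuity:.0f} c/deg")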

    Depth cues and perceived audiovisual synchrony of biological motion

    Due to their different propagation speeds, visual and auditory signals from external events arrive at the human sensory receptors with different delays. This delay varies systematically with distance, yet despite such variability most events are perceived as synchronous. There are, however, contradictory data and claims regarding the existence of compensatory mechanisms for distance in simultaneity judgments. Principal Findings: In this paper we used familiar audiovisual events, a visual walker and footstep sounds, and manipulated the number of depth cues. In a simultaneity judgment task we presented a large range of stimulus onset asynchronies corresponding to distances of up to 35 meters. We found an effect of distance on the simultaneity estimates, with greater distances requiring larger stimulus onset asynchronies, and vision always leading. This effect was stronger when both visual and auditory cues were present but, interestingly, was not found when depth cues were impoverished. Significance: These findings suggest an internal mechanism that compensates for audiovisual delays and that critically depends on the available depth information.
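    As a back-of-the-envelope illustration of why the natural audiovisual lag grows with distance (the study's actual stimulus timings are not restated here), sound travels at roughly 343 m/s while light transit time over these distances is negligible, so vision leads by about 2.9 ms per metre. The short sketch below computes this lag for a few distances up to the 35 m maximum.

# Illustrative only: physical lag of sound behind light for an event at a given
# distance, assuming a speed of sound of ~343 m/s (about 20 degrees C) and
# treating light travel time as negligible.
SPEED_OF_SOUND_M_PER_S = 343.0

def natural_audio_lag_ms(distance_m: float) -> float:
    """Milliseconds by which the sound lags the visual signal at this distance."""
    return distance_m / SPEED_OF_SOUND_M_PER_S * 1000.0

for d in (1, 5, 10, 20, 35):
    print(f"{d:>2} m: vision leads by about {natural_audio_lag_ms(d):5.1f} ms")
# At the 35 m maximum this is roughly 102 ms, the scale of vision-leading
# stimulus onset asynchronies that the distance manipulation implies.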

    Opposite Influence of Perceptual Memory on Initial and Prolonged Perception of Sensory Ambiguity

    Observers continually make unconscious inferences about the state of the world based on ambiguous sensory information. This process of perceptual decision-making may be optimized by learning from experience. We investigated the influence of previous perceptual experience on the interpretation of ambiguous visual information. Observers were pre-exposed to a perceptually stabilized sequence of an ambiguous structure-from-motion stimulus by means of intermittent presentation. When the same ambiguous stimulus subsequently re-appeared, perception was initially biased toward the previously stabilized perceptual interpretation. However, prolonged viewing revealed a bias toward the alternative perceptual interpretation. The prevalence of the alternative percept during ongoing viewing was largely due to increased durations of this percept, as there was no reliable decrease in the durations of the pre-exposed percept. Moreover, the duration of the alternative percept was modulated by the specific characteristics of the pre-exposure, whereas the durations of the pre-exposed percept were not. The increase in duration of the alternative percept was larger when the pre-exposure had lasted longer, and larger after ambiguous pre-exposure than after unambiguous pre-exposure. Using a binocular rivalry stimulus we found analogous perceptual biases, while pre-exposure did not affect eye bias. We conclude that previously perceived interpretations dominate at the onset of ambiguous sensory information, whereas alternative interpretations dominate during prolonged viewing. Thus, ambiguous information at first seems to be judged using familiar percepts, while later re-evaluation allows for alternative interpretations.

    Multisensory Perceptual Learning of Temporal Order: Audiovisual Learning Transfers to Vision but Not Audition

    Background: An outstanding question in sensory neuroscience is whether the perceived timing of events is mediated by a central supra-modal timing mechanism or by multiple modality-specific systems. We use a perceptual learning paradigm to address this question. Methodology/Principal Findings: Three groups were trained daily for 10 sessions on an auditory, a visual, or a combined audiovisual temporal order judgment (TOJ) task. Before training, groups were pre-tested on a range of TOJ tasks within and outside their trained modality, so that transfer of learning from the trained task could be measured by post-testing the other tasks. Robust TOJ learning (reduced temporal order discrimination thresholds) occurred for all groups, although auditory learning (dichotic 500/2000 Hz tones) was slightly weaker than visual learning (lateralised grating patches). Crossmodal TOJs also displayed robust learning. Post-testing revealed that improvements in temporal resolution acquired during visual learning transferred within modality to other retinotopic locations and orientations, but not to auditory or crossmodal tasks. Auditory learning did not transfer to visual or crossmodal tasks, nor did it transfer within audition to another frequency pair. In an interesting asymmetry, crossmodal learning transferred to all visual tasks but not to auditory tasks. Finally, in all conditions, learning to make TOJs for stimulus onsets did not transfer at all to discriminating temporal offsets. These data present a complex picture of timing processes.
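    For readers unfamiliar with how a temporal order discrimination threshold is obtained (the paper's exact analysis is not reproduced here), a common approach is to fit a cumulative Gaussian to the proportion of "A first" responses across stimulus onset asynchronies and take the just-noticeable difference from the fitted slope; learning then shows up as a smaller JND. A minimal sketch with hypothetical data:

# Minimal sketch (hypothetical data, not the study's analysis): estimate a TOJ
# threshold by fitting a cumulative Gaussian psychometric function.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(soa_ms, pss, sigma):
    # Probability of reporting "A first" as a function of SOA (onset of B minus
    # onset of A, so positive SOAs mean A really did appear first).
    return norm.cdf(soa_ms, loc=pss, scale=sigma)

soas = np.array([-240, -120, -60, -30, 0, 30, 60, 120, 240], dtype=float)
p_a_first = np.array([0.03, 0.10, 0.25, 0.40, 0.55, 0.68, 0.80, 0.93, 0.98])

(pss, sigma), _ = curve_fit(psychometric, soas, p_a_first, p0=[0.0, 60.0])
jnd = sigma * norm.ppf(0.75)  # SOA shift from 50% to 75% "A first" responses

print(f"PSS = {pss:.1f} ms, JND (discrimination threshold) = {jnd:.1f} ms")
# After successful TOJ learning the fitted JND would be smaller.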

    Audiotactile interactions in temporal perception

    Sources of variability in interceptive movements

    In order to successfully intercept a moving target one must be at the right place at the right time. But simply being there is seldom enough. One usually needs to make contact in a certain manner, for instance to hit the target in a certain direction. How this is best achieved depends on the exact task, but to get an idea of what factors may limit performance we asked people to hit a moving virtual disk through a virtual goal, and analysed the spatial and temporal variability in the way in which they did so. We estimated that for our task the standard deviations in timing and spatial accuracy are about 20 ms and 5 mm, respectively. Additional variability arises from individual movements being planned slightly differently and being adjusted during execution. We argue that the way our subjects moved was precisely tailored to the task demands, and that movement accuracy is limited not only by the muscles and their activation, but also, and probably even mainly, by the resolution of visual perception.
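    To make the quoted numbers concrete (the disk speed below is an assumed illustrative value, not one reported in the paper): along the target's direction of motion a timing error translates into a position error of speed multiplied by the timing error, and independent spatial and temporal sources combine in quadrature. A minimal sketch:

# Illustrative arithmetic only: how ~20 ms of timing variability and ~5 mm of
# spatial variability combine along the target's direction of motion. The disk
# speed is an assumed value, not taken from the paper.
import math

target_speed_m_per_s = 0.5   # hypothetical disk speed
sd_timing_s = 0.020          # ~20 ms temporal standard deviation
sd_spatial_m = 0.005         # ~5 mm spatial standard deviation

spatial_error_from_timing = target_speed_m_per_s * sd_timing_s  # 10 mm at 0.5 m/s
sd_along_motion = math.hypot(spatial_error_from_timing, sd_spatial_m)

print(f"timing alone contributes {spatial_error_from_timing * 1000:.1f} mm; "
      f"combined SD along the motion direction is about {sd_along_motion * 1000:.1f} mm")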